Data Integration

Multi-source Connectivity

Synapsis Analysis integrates data from a wide range of heterogeneous sources, allowing systems and environments that do not natively share data to be queried and analyzed together.

The platform supports multiple types of data sources, including:

  • Relational databases (e.g., PostgreSQL, MySQL, SQL Server)
  • NoSQL databases, for handling unstructured or semi-structured data
  • Structured files such as CSV and JSON
  • Data warehouses, for large-scale analytical workloads
  • IoT streams and time-series data, supporting high-frequency telemetry ingestion

This multi-source connectivity allows organizations to consolidate distributed data without requiring changes to existing infrastructure, creating a unified and consistent data layer for analysis.
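As a minimal sketch of what such a unified layer looks like in principle (this is not the platform's actual API), the example below consolidates two hypothetical sources, a CSV table and a JSON feed, into a single in-memory SQLite database so both can be queried through one interface:

```python
import csv
import io
import json
import sqlite3

# Hypothetical sample sources standing in for a relational export and a JSON feed.
csv_source = io.StringIO("device_id,region\nd1,eu\nd2,us\n")
json_source = '[{"device_id": "d1", "reading": 21.5}, {"device_id": "d2", "reading": 19.0}]'

# An in-memory SQLite database plays the role of the unified data layer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (device_id TEXT, region TEXT)")
conn.execute("CREATE TABLE readings (device_id TEXT, reading REAL)")

# Ingest each source into the shared layer without changing the sources themselves.
conn.executemany(
    "INSERT INTO devices VALUES (?, ?)",
    [(r["device_id"], r["region"]) for r in csv.DictReader(csv_source)],
)
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(r["device_id"], r["reading"]) for r in json.loads(json_source)],
)

# Both sources are now queryable with a single SQL statement.
rows = conn.execute(
    "SELECT d.region, r.reading FROM devices d "
    "JOIN readings r USING (device_id) ORDER BY d.region"
).fetchall()
print(rows)  # → [('eu', 21.5), ('us', 19.0)]
```

In a real deployment the sources would be live connections (databases, warehouses, streams) rather than in-memory strings, but the principle is the same: ingestion adapts to each source, while consumers see one consistent layer.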

Data Modeling

Synapsis Analysis provides flexible data modeling capabilities that enable users to structure and organize data according to analytical needs.

Key features include:

  • Logical datasets (virtual datasets), allowing abstraction from physical data sources
  • Joins across heterogeneous sources, enabling correlation between different data domains
  • Creation of views and aggregations, to simplify complex data structures
  • Semantic data abstraction, improving data readability and usability for analytical purposes

These capabilities allow users to build coherent and reusable data models, facilitating efficient exploration and analysis of complex datasets.
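The idea of a logical (virtual) dataset can be illustrated with a plain SQL view, sketched here in SQLite; the table and column names are invented for the example and do not reflect the platform's own modeling syntax:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('eu', 100), ('eu', 50), ('us', 80);

-- A view acts as a logical dataset: no data is copied, and consumers
-- query the abstraction (with its aggregation baked in) instead of
-- the physical table underneath.
CREATE VIEW sales_by_region AS
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region;
""")

result = conn.execute("SELECT * FROM sales_by_region ORDER BY region").fetchall()
print(result)  # → [('eu', 150.0), ('us', 80.0)]
```

Because the view is defined over the source rather than materialized from it, changes to the underlying table are reflected automatically, which is what makes such models reusable across analyses.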

ETL / ELT Pipelines

The platform supports advanced ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes, enabling the transformation and preparation of data for analysis.

Through configurable pipelines, users can perform:

  • Data transformations and filtering, to clean and standardize incoming data
  • Temporal aggregations, particularly suited for time-series and IoT data
  • Computation of derived KPIs, based on business logic and analytical requirements
  • Scheduled or real-time execution, depending on operational needs

These pipelines ensure that data is consistently processed, enriched, and made available for downstream analytics in both real-time and batch scenarios.
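The transform stage of such a pipeline can be sketched in a few lines of plain Python. The example below assumes a hypothetical IoT telemetry feed (the field names, the `-999.0` sentinel, and the hourly-mean KPI are all illustrative, not platform defaults) and shows filtering, temporal aggregation, and a derived KPI in one pass:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw telemetry, as it might arrive from the extract stage.
raw = [
    {"ts": "2024-01-01T10:05:00", "sensor": "s1", "value": 20.0},
    {"ts": "2024-01-01T10:20:00", "sensor": "s1", "value": 22.0},
    {"ts": "2024-01-01T11:02:00", "sensor": "s1", "value": -999.0},  # sensor error sentinel
    {"ts": "2024-01-01T11:15:00", "sensor": "s1", "value": 24.0},
]

def transform(records):
    """Filter invalid readings, then aggregate to an hourly mean per sensor
    (the derived KPI in this sketch)."""
    buckets = defaultdict(list)
    for r in records:
        if r["value"] <= -999.0:  # cleaning/filtering step
            continue
        # Temporal aggregation: truncate each timestamp to its hour bucket.
        hour = datetime.fromisoformat(r["ts"]).replace(minute=0, second=0)
        buckets[(r["sensor"], hour)].append(r["value"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

kpis = transform(raw)
# The load stage would then write `kpis` to the analytical store,
# on a schedule for batch runs or per micro-batch for real-time execution.
```

The same function works unchanged whether it is invoked by a scheduler over a day of data or by a streaming runner over a small window, which is why the batch/real-time distinction is an execution concern rather than a pipeline-logic one.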